    Simulating Tail Probabilities in GI/GI/1 Queues and Insurance Risk Processes with Subexponential Distributions

    This paper deals with estimating small tail probabilities of the steady-state waiting time in a GI/GI/1 queue with heavy-tailed (subexponential) service times. The problem of estimating infinite horizon ruin probabilities in insurance risk processes with heavy-tailed claims can be transformed into the same framework. It is well known that naive simulation is ineffective for estimating small probabilities, and special fast simulation techniques like importance sampling, multilevel splitting, etc., have to be used. Though there exists a vast amount of literature on the rare event simulation of queueing systems and networks with light-tailed distributions, previous fast simulation techniques for queues with subexponential service times have been confined to the M/GI/1 queue. The general approach is to use the Pollaczek-Khintchine transformation to convert the problem into that of estimating the tail distribution of a geometric sum of independent subexponential random variables. However, no such useful transformation exists when one goes from Poisson arrivals to general interarrival-time distributions. We describe and evaluate an approach that is based on directly simulating the random walk associated with the waiting-time process of the GI/GI/1 queue, using a change of measure called delayed subexponential twisting, an importance sampling idea recently developed and found useful in the context of M/GI/1 heavy-tailed simulations.
    Keywords: importance sampling; rare event simulation; subexponential distributions; insurance risk; GI/GI/1 queues
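
    The random-walk representation mentioned in the abstract is easy to make concrete. The sketch below is only the naive baseline the paper argues against: it estimates P(W > x) via Lindley's recursion with exponential interarrivals and Pareto service times (all parameters are hypothetical choices for illustration); a fast estimator would replace the plain sampling step with the delayed-subexponential-twisting change of measure.

```python
import numpy as np

rng = np.random.default_rng(0)

def pareto_service(alpha=2.5):
    # Pareto(alpha) on [1, inf): a standard example of a subexponential law.
    return (1.0 - rng.random()) ** (-1.0 / alpha)

def waiting_time_tail(x, n=200_000, warmup=10_000):
    """Estimate P(W > x) as the long-run fraction of customers with W_k > x."""
    w, hits = 0.0, 0
    for k in range(n + warmup):
        a = rng.exponential(2.0)   # interarrival time, mean 2
        b = pareto_service()       # heavy-tailed service time, mean 5/3
        w = max(0.0, w + b - a)    # Lindley recursion for the waiting time
        if k >= warmup:
            hits += w > x
    return hits / n

print(waiting_time_tail(x=50.0))
```

    Because the event {W > x} is rare, almost all samples above contribute nothing to the estimate, which is exactly the inefficiency an importance sampling scheme is designed to remove.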

    A Multiserver Queueing System with Impatient Customers

    Many real-world situations involve queueing systems in which customers wait for service for a limited time only and leave the system if service has not begun within that time. This paper considers a multiserver queueing system with impatient customers, where the customers arrive according to a Poisson process and the service requirements have a general distribution. A simple and insightful solution is presented for the loss probability. The solution is exact for exponential services and is an excellent heuristic for general service times.
    Keywords: M/G/c queue, impatience, loss probability, heuristic
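
    A loss probability of this kind is straightforward to estimate by discrete-event simulation, which is one way such a heuristic could be checked numerically. The sketch below is a hypothetical M/G/c model with deterministic patience tau (a customer abandons if service has not begun within tau of arrival); the lognormal service law, arrival rate, and all other parameters are illustrative assumptions, not taken from the paper.

```python
import heapq
import random
from collections import deque

def loss_probability(lam=10.0, c=8, tau=0.5, n=200_000, seed=1):
    """Fraction of customers lost in an M/G/c queue with patience tau."""
    rng = random.Random(seed)
    service = lambda: rng.lognormvariate(-0.5, 1.0)  # illustrative general law
    completions = []   # min-heap of completion times, one per busy server
    queue = deque()    # FIFO of abandonment deadlines (arrival time + tau)
    t, lost = 0.0, 0
    for _ in range(n):
        t += rng.expovariate(lam)                    # next Poisson arrival
        # process every service completion that occurs before this arrival
        while completions and completions[0] <= t:
            done = heapq.heappop(completions)
            # the freed server takes the oldest customer still willing to wait
            while queue:
                deadline = queue.popleft()
                if deadline >= done:                 # service begins at `done`
                    heapq.heappush(completions, done + service())
                    break
                lost += 1                            # abandoned before `done`
        if len(completions) < c:                     # an idle server serves now
            heapq.heappush(completions, t + service())
        else:
            queue.append(t + tau)                    # wait, at most until t+tau
    # customers still queued at the end are left unclassified; for large n
    # this boundary effect is negligible.
    return lost / n

print(loss_probability())
```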

    The shape of the loss curve and the impact of long-range dependence on network performance

    Empirical studies have shown that many types of network traffic exhibit long-range dependence (LRD), i.e., burstiness over a wide variety of time-scales. Given that traffic streams are indeed endowed with LRD properties, the next question is: what is their impact on network performance?

    To assess this issue, we consider a generic source model: traffic generated by an individual user is modeled as a fluid on/off pattern with generally distributed on- and off-times; LRD traffic is obtained by choosing the on-times heavy-tailed. We focus on an aggregation of many i.i.d. sources, say n, multiplexed on a FIFO queue, with the queueing resources scaled accordingly. Large deviations analysis says that the (steady-state) overflow probability decays exponentially in n; we call the corresponding decay rate, as a function of the buffer size B, the loss curve.

    To get insight into the influence of the distribution of the on- and off-times, we list the most significant properties of the loss curve. Strikingly, for small B, the decay rate depends on the distributions only through their means. For large B there is no such insensitivity property. In case of heavy-tailed on-times, the decay of the loss probability in the buffer size is slower than exponential; this is in stark contrast with light-tailed on-times, in which case this decay is at least exponential.

    To assess the sensitivity of the performance metrics to the probabilistic properties of the input, we compute the loss curve for a number of representative examples (voice, video, file transfer, web browsing, etc.), with realistic distributions and parameters.

    Our conclusions on the impact of LRD on the performance can be summarized as follows: (1) If the maximally tolerable delay is relatively small, there is hardly any difference between heavy-tailed and light-tailed inputs; this gives a theoretical handle on observations that appeared in the literature. Only for very delay-tolerant applications do the above-mentioned large-B results kick in. (2) The level of aggregation is a significant factor. If the ratio between the link rate and the peak rate of a single source is high, a high utilization can be achieved while at the same time the delay requirements are met; this holds even if the delay requirements are stringent.
    Keywords: packet networking; long-range dependence; queueing theory; large deviations asymptotics; buffer overflow; heavy-tailed distributions
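
    The many-sources scaling described above can be reproduced in a toy experiment. The following sketch is a slotted, discrete-time approximation in which all distributions and parameters are hypothetical: n i.i.d. on/off sources are multiplexed on a FIFO queue with buffer nb and service rate nc per slot, and the overflow frequency is compared for heavy-tailed (Pareto) versus light-tailed (geometric) on-times.

```python
import numpy as np

rng = np.random.default_rng(0)

def on_off_trace(slots, heavy_tailed, alpha=1.5, mean_on=3.0, mean_off=7.0):
    """0/1 activity trace of one source, alternating off- and on-periods."""
    out = []
    while len(out) < slots:
        off = rng.geometric(1.0 / mean_off)
        if heavy_tailed:
            # Pareto(alpha) on-times, alpha < 2: infinite variance, LRD input
            on = int(np.ceil((1.0 - rng.random()) ** (-1.0 / alpha)))
        else:
            on = rng.geometric(1.0 / mean_on)      # light-tailed contrast
        out += [0] * off + [1] * on
    return np.array(out[:slots])

def overflow_fraction(n=100, b=0.5, c=0.5, slots=20_000, heavy_tailed=True):
    """Fraction of slots in which the queue exceeds the scaled buffer n*b."""
    agg = sum(on_off_trace(slots, heavy_tailed) for _ in range(n))
    q, hits = 0.0, 0
    for inflow in agg:
        q = max(0.0, q + inflow - n * c)           # drain at rate n*c per slot
        hits += q > n * b
    return hits / slots

print(overflow_fraction(heavy_tailed=True), overflow_fraction(heavy_tailed=False))
```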

    Fast Simulation of a Queue fed by a Superposition of Many (Heavy-Tailed) Sources

    We consider a queue fed by a large number, say n, of on-off sources with generally distributed on- and off-times. The queueing resources are scaled by n: the buffer is B = nb and the link rate is C = nc. The model is versatile: it allows us to model both long-range dependent traffic (by using heavy-tailed distributed on-periods) and short-range dependent traffic (by using light-tailed on-periods). A crucial performance metric in this model is the steady-state buffer overflow probability. This overflow probability decays exponentially in the number of sources n. Therefore, if the number of sources grows large, naive simulation is too time-consuming, and we have to use fast simulation techniques instead. Due to the exponential decay (in n), importance sampling with an exponential change of measure essentially goes through, irrespective of the on-times being heavy-tailed or light-tailed. An asymptotically optimal change of measure is found by using large deviations arguments. Notably, the change of measure is not constant during the simulation run, which is essentially different from many other studies (usually relying on large buffer asymptotics). We provide numerical examples to show that the resulting importance sampling procedure indeed improves considerably over naive simulation. We present some accelerations. Finally, we give short comments on the influence of the shape of the distributions on the loss probability, and we describe the limitations of our technique.
    Keywords: long-range dependence; importance sampling; queueing theory; large deviations asymptotics; buffer overflow; heavy-tailed random variables
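
    The exponential change of measure invoked here can be illustrated on a stripped-down problem. The sketch below estimates the single-slot overflow probability P(S_n > nc') for n i.i.d. Bernoulli source activities, with the tilting parameter chosen by the standard large-deviations recipe so that overflow becomes typical under the new measure. This is a static simplification, not the paper's state-dependent scheme over a whole busy period, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def tilted_overflow(n=100, p=0.3, cprime=0.5, runs=10_000):
    """Importance sampling estimate of P(S_n > n*c'), S_n ~ Binomial(n, p)."""
    q = 1.0 - p
    # Exponential tilting of Bernoulli(p): p_theta = p e^theta / (p e^theta + q).
    # Choose theta* so that the tilted mean equals c' (overflow becomes typical):
    #   p_theta* = c'  =>  e^theta* = c' q / (p (1 - c')).
    e_theta = cprime * q / (p * (1.0 - cprime))
    theta = np.log(e_theta)
    log_mgf = np.log(p * e_theta + q)          # log M(theta) of one Bernoulli(p)
    p_tilt = p * e_theta / (p * e_theta + q)   # tilted success probability = c'
    s = rng.binomial(n, p_tilt, size=runs)     # sample the sum under the tilt
    # likelihood ratio dP/dP_theta = exp(-theta*S + n*log M(theta))
    lr = np.exp(-theta * s + n * log_mgf)
    return np.mean((s > n * cprime) * lr)

print(tilted_overflow())
```

    Reweighting each sample by the likelihood ratio keeps the estimator unbiased while concentrating the simulation effort on the rare overflow event.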

    Simulating Ruin Probabilities In Insurance Risk Processes With Subexponential Claims

    We describe a fast simulation framework for simulating small ruin probabilities in insurance risk processes with subexponential claims. Naive simulation is inefficient since the event of interest is rare, and special simulation techniques like importance sampling need to be used. An importance sampling change of measure known as subexponential twisting has been found useful for some rare event simulations in the subexponential context. We describe conditions that are sufficient to ensure that the infinite horizon ruin probability can be estimated in a (work-normalized) large-set asymptotically optimal manner, using this change of measure. These conditions are satisfied for some large classes of insurance risk processes -- e.g., processes with Markov-modulated claim arrivals and claim sizes -- where the heavy tails are of the "Weibull type". We also give much weaker conditions for the estimation of the finite horizon ruin probability. Finally, we present experiments supporting our results.
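
    The subexponential (hazard-rate) twisting change of measure can be sketched on the simplest target it applies to: the tail of a fixed sum of Weibull-type claims, rather than the full infinite-horizon ruin probability. For a Weibull tail P(X > x) = exp(-x^beta) with beta < 1, the hazard function is Lambda(x) = x^beta; twisting by a factor delta in (0, 1) makes Lambda(X) exponential with rate 1 - delta, so large outcomes become typical. The parameter choices below are illustrative, not the tuning from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def twisted_tail(u=200.0, n=10, beta=0.5, delta=0.7, runs=100_000):
    """Hazard-rate-twisted estimate of P(X_1 + ... + X_n > u), X_i Weibull."""
    # Under the twist, Lambda(X) = X**beta ~ Exp(1 - delta), so we sample
    # X = (E / (1 - delta))**(1/beta) with E ~ Exp(1).
    e = rng.exponential(size=(runs, n))
    x = (e / (1.0 - delta)) ** (1.0 / beta)
    # per-claim likelihood ratio f(x)/f_delta(x) = exp(-delta*x**beta)/(1-delta)
    log_lr = (-delta * x ** beta - np.log(1.0 - delta)).sum(axis=1)
    hit = x.sum(axis=1) > u
    return np.mean(hit * np.exp(log_lr))

print(twisted_tail())
```

    With delta = 0.7 the twisted claims have mean roughly 2/(1-delta)^2, so the twisted sum is centered near u and the rare event is hit on a large fraction of the runs; the likelihood-ratio weights undo the distortion.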
